Search results for "Systems Integration"
Showing 10 of 10 documents
Preparing for CRIS: Challenges and Opportunities for Systems Integration at Finnish Universities
2014
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014. This paper raises issues presented by integrating administrative research information into the institutional repository (IR) and other systems, especially in the context of Finnish universities. The observations are based on preparations for procuring a Current Research Information System (CRIS) at the University of Jyväskylä. The CRIS will be used by various stakeholders in different organizational units, who have conflicting requirements and differing notions of system usage. For example, national reporting considerably affects publication cataloging conventions. Determining the optimal data flow for handling publications,…
An integrated approach based on uniform quantization for the evaluation of complexity of short-term heart period variability: Application to 24 h Hol…
2007
We propose an integrated approach based on uniform quantization over a small number of levels for the evaluation and characterization of complexity of a process. This approach integrates information-domain analysis based on entropy rate, local nonlinear prediction, and pattern classification based on symbolic analysis. Normalized and non-normalized indexes quantifying complexity over short data sequences (∼300 samples) are derived. This approach provides a rule for deciding the optimal length of the patterns that may be worth considering and some suggestions about possible strategies to group patterns into a smaller number of families. The approach is applied to 24 h Holter recordings of …
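The core of the approach described above — uniform quantization over a few levels followed by the entropy of short symbolic patterns — can be sketched as follows. This is a minimal illustration of the general symbolic-analysis idea, not the paper's method: the function name, default number of levels, and pattern length are all illustrative choices.

```python
import numpy as np
from collections import Counter

def symbolic_pattern_entropy(x, n_levels=6, pattern_len=3):
    """Shannon entropy (bits) of short symbolic patterns of a series.

    Sketch of symbolic complexity analysis: uniformly quantize the series
    over a small number of levels, group consecutive symbols into short
    overlapping patterns, and measure the entropy of the empirical pattern
    distribution. Names and defaults are illustrative, not from the paper.
    """
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    if hi == lo:                      # a constant series carries no complexity
        return 0.0
    # Uniform quantization over the dynamic range into n_levels symbols
    symbols = np.minimum(((x - lo) / (hi - lo) * n_levels).astype(int),
                         n_levels - 1)
    # Overlapping patterns of pattern_len consecutive symbols
    patterns = [tuple(symbols[i:i + pattern_len])
                for i in range(len(symbols) - pattern_len + 1)]
    # Shannon entropy of the pattern distribution
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

A normalized index in [0, 1] could then be obtained by dividing the result by the maximum attainable value, pattern_len * log2(n_levels).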
Human digital twins and cognitive mimetic
2021
Digital twins – digital models of technical systems and processes – have recently been introduced to work with complex industrial processes. Yet should such models concern only physical objects (as definitions of them often imply), or should users and other human beings also be included? Models that include people have been called human digital twins (HDTs); they facilitate more accurate analyses of technologies in practical use. The cognitive mimetic approach can be used to describe human interactions with technologies. This approach analyses human information processes such as perceiving and thinking to mimic how people process information in order to design intelligent technologies. The …
A new interface to couple thin-layer chromatography with laser desorption/atmospheric pressure chemical ionization mass spectrometry for plate scanni…
2005
An interface to allow on-line qualitative and quantitative full-plate detection and analysis of compounds separated by thin-layer chromatography (TLC) is presented. A continuous wave diode laser is employed as a desorption source. Atmospheric pressure chemical ionization mass spectrometry ionizes and subsequently identifies the desorbed sample molecules. Besides direct laser desorption on untreated TLC plates, graphite particles were used as a matrix to couple in the laser power and improve the efficiency of desorption.
GPCALMA: A Grid-based tool for mammographic screening
2005
The next generation of High Energy Physics (HEP) experiments requires a GRID approach to a distributed computing system and the associated data management: the key concept is the Virtual Organisation (VO), a group of distributed users with a common goal and the will to share their resources. A similar approach is being applied to a group of Hospitals which joined the GPCALMA project (Grid Platform for Computer Assisted Library for MAmmography), which will allow common screening programs for early diagnosis of breast and, in the future, lung cancer. HEP techniques come into play in writing the application code, which makes use of neural networks for the image analysis and proved to be useful…
ballaxy: web services for structural bioinformatics.
2014
Abstract Motivation: Web-based workflow systems have gained considerable momentum in sequence-oriented bioinformatics. In structural bioinformatics, however, such systems are still relatively rare; while commercial stand-alone workflow applications are common in the pharmaceutical industry, academic researchers often still rely on command-line scripting to glue individual tools together. Results: In this work, we address the problem of building a web-based system for workflows in structural bioinformatics. For the underlying molecular modelling engine, we opted for the BALL framework because of its extensive and well-tested functionality in the field of structural bioinformatics. The large …
Inter-Model Consistency and Complementarity: Learning from ex-vivo Imaging and Electrophysiological Data towards an Integrated Understanding of Cardi…
2011
Computational models of the heart at various scales and levels of complexity have been independently developed, parameterised and validated using a wide range of experimental data for over four decades. However, despite remarkable progress, the lack of coordinated efforts to compare and combine these computational models has limited their impact on the numerous open questions in cardiac physiology. To address this issue, a comprehensive dataset has previously been made available to the community that contains the cardiac anatomy and fibre orientations from magnetic resonance imaging as well as epicardial transmembrane potentials from optical mapping measured on a per…
Pervasive access to MRI bias artifact suppression service on a grid.
2009
Bias artifact corrupts magnetic resonance images in such a way that the image is afflicted by illumination variations. Some of the authors proposed the Exponential Entropy Driven - Homomorphic Unsharp Masking (E2D-HUM) algorithm, which corrects this artifact without any a priori hypothesis about the tissues or the magnetic resonance image modality. Moreover, E2D-HUM is independent of the body part under examination and does not require any particular training task. People who want to use this algorithm, which is Matlab-based, have to set up their own computers to execute it. Furthermore, they have to be skilled in Matlab to exploit all the features of the algorithm. In our work we propos…
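While E2D-HUM's entropy-driven parameter selection is beyond a short sketch, the underlying homomorphic unsharp masking principle — estimate the slowly varying bias field with a heavy low-pass filter, divide it out, and restore the mean intensity — can be illustrated with a numpy-only sketch. The kernel size here is a hypothetical, hand-picked parameter, not anything prescribed by the paper.

```python
import numpy as np

def _box_blur(img, k):
    """Separable box blur (k odd) with edge padding; a crude low-pass filter."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode='valid'),
                              1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode='valid'),
                              0, out)
    return out

def hum_bias_correct(img, k=31, eps=1e-6):
    """Homomorphic unsharp masking sketch for MRI bias (illumination) artifact.

    The smooth multiplicative bias field is approximated by a heavy low-pass
    of the image and divided out; the global mean level is restored so overall
    brightness is unchanged. This shows only the HUM principle; E2D-HUM's
    entropy-driven tuning is not reproduced here.
    """
    img = np.asarray(img, dtype=float)
    bias_estimate = _box_blur(img, k) + eps  # eps avoids division by zero
    return img / bias_estimate * img.mean()
```

On a synthetic image with a smooth multiplicative gradient, the interior of the corrected image comes out noticeably flatter than the input, which is exactly the artifact-suppression effect the abstract describes.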
Applying Genre-Based Ontologies to Enterprise Architecture
2007
This paper elaborates the approach of using ontologies as a conceptual base for enterprise architecture (EA) descriptions. The method focuses on recognising and modelling business critical information concepts, their content, and semantics used to operate the business. Communication genres and open and semi-structured information need interviews are used as a domain analysis method. Ontologies aim to explicate the results of domain analysis and to provide a common reference model for Business Information Architecture (BIA) descriptions. The results are generalised to model further aspects of EA.
Combined TL and ¹⁰B-alanine ESR dosimetry for BNCT
2004
The dosimetric technique described in this paper is based on electron spin resonance (ESR) detectors using an alanine-boric acid compound enriched with ¹⁰B, and beryllium oxide thermoluminescent (TL) detectors; with this combined dosimetry, it is possible to discriminate the doses due to thermal neutrons and gamma radiation in a mixed field. Irradiations were carried out inside the thermal column of a TRIGA MARK II water-pool-type research nuclear reactor, also used for Boron Neutron Capture Therapy (BNCT) applications, with thermal neutron fluences from 10⁹ to 10¹⁴ n_th cm⁻². The ESR dosemeters using the alanine-boron compound indicated ESR signals about 30-fold stronger than those…